Sample-Efficient L0-L2 Constrained Structure Learning of Sparse Ising Models

Authors

Abstract

We consider the problem of learning the underlying graph of a sparse Ising model with p nodes from n i.i.d. samples. The most recent and best performing approaches combine an empirical loss (the logistic regression loss or the interaction screening loss) with a regularizer (an L1 penalty or an L1 constraint). This results in a convex problem that can be solved separately for each node of the graph. In this work, we leverage the cardinality constraint L0 norm, which is known to properly induce sparsity, and further combine it with an L2 norm to better model the non-zero coefficients. We show that our proposed estimators achieve an improved sample complexity, both (a) theoretically, by reaching new state-of-the-art upper bounds for recovery guarantees, and (b) empirically, by showing sharper phase transitions between poor and full recovery for graph topologies studied in the literature, when compared to their L1-based counterparts.
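The per-node estimator described in the abstract can be sketched as projected gradient descent on a ridge-regularized logistic loss, where the projection enforces the L0 (cardinality) constraint by keeping only the k largest-magnitude coefficients. This is a minimal illustrative sketch, not the paper's implementation; the function name, hyperparameters, and the use of plain iterative hard thresholding are assumptions.

```python
import numpy as np

def fit_node_l0_l2(X, node, k, l2=0.1, lr=0.1, iters=500):
    """Sketch of an L0-L2 constrained node-wise logistic estimator.

    X    : (n, p) array of +/-1 spins from the Ising model.
    node : index of the spin to predict from the others.
    k    : cardinality budget (at most k non-zero couplings).
    l2   : ridge (L2) regularization weight.

    Returns a length-p coupling vector with w[node] == 0 and at most
    k non-zero entries, estimated via iterative hard thresholding.
    """
    n, p = X.shape
    y = X[:, node]                  # spin to predict
    Z = np.delete(X, node, axis=1)  # the remaining p-1 spins
    w = np.zeros(p - 1)
    for _ in range(iters):
        # Gradient of the mean logistic loss log(1 + exp(-y * Z @ w))
        # plus the ridge term; y / (1 + exp(margins)) = y * sigmoid(-margins).
        margins = y * (Z @ w)
        grad = -(Z * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0) + l2 * w
        w -= lr * grad
        # L0 projection: zero out all but the k largest-magnitude entries.
        if k < p - 1:
            w[np.argsort(np.abs(w))[: p - 1 - k]] = 0.0
    return np.insert(w, node, 0.0)  # re-insert the excluded node as 0
```

Running one such fit per node and symmetrizing the resulting supports (e.g., by intersection or union of estimated edges) yields a graph estimate, mirroring the node-separable structure noted in the abstract.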


Similar Articles

Interaction Screening: Efficient and Sample-Optimal Learning of Ising Models

We consider the problem of learning the underlying graph of an unknown Ising model on p spins from a collection of i.i.d. samples generated from the model. We suggest a new estimator that is computationally efficient and requires a number of samples that is near-optimal with respect to the previously established information-theoretic lower bound. Our statistical estimator has a physical interpretat...


Learning Efficient Structured Sparse Models

We present a comprehensive framework for structured sparse coding and modeling, extending the recent ideas of using learnable fast regressors to approximate exact sparse codes. For this purpose, we propose an efficient feed-forward architecture derived from the iteration of the block-coordinate algorithm. This architecture approximates the exact structured sparse codes with a fraction of the com...


Structure learning of antiferromagnetic Ising models

In this paper we investigate the computational complexity of learning the graph structure underlying a discrete undirected graphical model from i.i.d. samples. Our first result is an unconditional computational lower bound of Ω(p^{d/2}) for learning general graphical models on p nodes of maximum degree d, for the class of so-called statistical algorithms recently introduced by Feldman et al. [1]. T...


Learning Sparse Neural Networks through L0 Regularization

We propose a practical method for L0 norm regularization for neural networks: pruning the network during training by encouraging weights to become exactly zero. Such regularization is interesting since (1) it can greatly speed up training and inference, and (2) it can improve generalization. AIC and BIC, well-known model selection criteria, are special cases of L0 regularization. However, since...


High-Dimensional Robust Structure Learning of Ising Models on Sparse Random Graphs

This paper considers structure learning of ferromagnetic Ising models Markov on sparse Erdős-Rényi random graphs with constant average degree c > 0. We propose simple, local and robust algorithms and analyze their performances in the regime of correlation decay, i.e., when c tanh(J_max) < 1 (where J_max is the maximum inverse temperature in the model). The algorithms are robust because (i) they do n...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i8.16884